

Search results: all records where Creators/Authors contains "Hong, Jason I."


  1. People work with AI systems to improve their decision making, but often under- or over-rely on AI predictions and perform worse than they would have unassisted. To help people rely appropriately on AI aids, we propose showing them behavior descriptions: details of how an AI system performs on subgroups of instances. We tested the efficacy of behavior descriptions through user studies with 225 participants in three distinct domains: fake review detection, satellite image classification, and bird classification. We found that behavior descriptions can increase human-AI accuracy through two mechanisms: helping people identify AI failures and increasing people's reliance on the AI when it is more accurate. These findings highlight the importance of people's mental models in human-AI collaboration and show that informing people of high-level AI behaviors can significantly improve AI-assisted decision making.

     
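     A minimal sketch of how such behavior descriptions might be computed from a labeled validation set (the subgroup names, the accuracy-gap threshold, and the example data are illustrative assumptions, not the paper's implementation):

         from collections import defaultdict

         def behavior_descriptions(records, overall_acc, gap=0.10):
             """Summarize AI performance on subgroups of instances.

             records: iterable of (subgroup, prediction, true_label) tuples.
             overall_acc: the model's overall accuracy, for comparison.
             gap: smallest accuracy difference worth reporting (assumed).
             """
             correct, total = defaultdict(int), defaultdict(int)
             for subgroup, pred, label in records:
                 total[subgroup] += 1
                 correct[subgroup] += int(pred == label)
             descriptions = []
             for subgroup in sorted(total):
                 acc = correct[subgroup] / total[subgroup]
                 if acc <= overall_acc - gap:
                     descriptions.append(f"The AI is less accurate on "
                                         f"{subgroup} ({acc:.0%} vs. "
                                         f"{overall_acc:.0%} overall).")
                 elif acc >= overall_acc + gap:
                     descriptions.append(f"The AI is more accurate on "
                                         f"{subgroup} ({acc:.0%} vs. "
                                         f"{overall_acc:.0%} overall).")
             return descriptions

         # Illustrative data: fake review detection, split by review length.
         records = [("short reviews", 1, 1), ("short reviews", 0, 1),
                    ("long reviews", 1, 1), ("long reviews", 0, 0)]
         print(behavior_descriptions(records, overall_acc=0.75))

     Descriptions like these tell a person when to trust the AI more, or less, than its overall accuracy alone would suggest.
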
  2. Algorithmic systems help manage the governance of digital platforms featuring user-generated content, including how money is distributed to creators from the profits a platform earns from advertising on this content. However, creators producing content about disadvantaged populations have reported that these kinds of systems are biased, having associated their content with prohibited or unsafe content, leading to what creators believed were error-prone decisions to demonetize their videos. Motivated by these reports, we present the results of 20 interviews with YouTube creators and a content analysis of videos, tweets, and news about demonetization cases to understand YouTubers' perceptions of demonetization affecting videos featuring disadvantaged or vulnerable populations, as well as how creators responded to demonetization and what kinds of tools and infrastructure support they desired. We found creators had concerns about YouTube's algorithmic system stereotyping content featuring vulnerable demographics in harmful ways, for example by labeling it "unsafe" for children or families; creators believed these demonetization errors led to a range of economic, social, and personal harms. To provide more context to these findings, we analyzed and report on the technique a few creators used to audit YouTube's algorithms to learn what could cause the demonetization of videos featuring LGBTQ people, culture, and/or social issues. In response to the varying beliefs about the causes and harms of demonetization errors, we found our interviewees wanted more reliable information and statistics about demonetization cases and errors, more control over their content and advertising, and better economic security.
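     The creator-run audit mentioned above can be framed as a controlled word-substitution experiment: hold all other metadata constant, vary one term, and compare monetization outcomes. A hypothetical sketch (check_monetization is a stand-in for however a creator actually observes a video's monetization status, and the terms are illustrative):

         def audit_terms(base_title, terms, check_monetization, trials=5):
             """Return the observed demonetization rate for each term."""
             rates = {}
             for term in terms:
                 title = f"{base_title} {term}"
                 # Upload (or re-title) otherwise identical videos and count
                 # how many are flagged for limited or no ads.
                 flagged = sum(check_monetization(title) for _ in range(trials))
                 rates[term] = flagged / trials
             return rates

         # Illustrative use: a neutral control term vs. an identity term
         # creators suspected was being penalized. A large gap between the
         # two rates would suggest term-based demonetization.
         # rates = audit_terms("my daily vlog:", ["happy", "LGBTQ"],
         #                     check_monetization)
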
  3. Apple announced the introduction of app privacy details to their App Store in December 2020, marking the first ever real-world, large-scale deployment of the privacy nutrition label concept, which had been introduced by researchers over a decade earlier. The Apple labels are created by app developers, who self-report their app’s data practices. In this paper, we present the first study examining the usability and understandability of Apple’s privacy nutrition label creation process from the developer’s perspective. By observing and interviewing 12 iOS app developers about how they created the privacy label for a real-world app that they developed, we identified common challenges for correctly and efficiently creating privacy labels. We discuss design implications both for improving Apple’s privacy label design and for future deployment of other standardized privacy notices.
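     For context, a privacy label is a structured self-report. A rough sketch of one app's disclosure, using category and purpose names from Apple's public label taxonomy (the dict schema itself is an assumption for illustration, not Apple's format):

         # One app's self-reported privacy label, organized by privacy type.
         privacy_label = {
             "Data Used to Track You": [
                 {"category": "Identifiers", "data": ["Device ID"],
                  "purposes": ["Third-Party Advertising"]},
             ],
             "Data Linked to You": [
                 {"category": "Contact Info", "data": ["Email Address"],
                  "purposes": ["App Functionality"]},
             ],
             "Data Not Linked to You": [
                 {"category": "Diagnostics", "data": ["Crash Data"],
                  "purposes": ["Analytics"]},
             ],
         }

     Filling in these fields correctly requires a developer to account for every data flow in the app, including those of third-party SDKs.
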
  4. We present Peekaboo, a new privacy-sensitive architecture for smart homes that leverages an in-home hub to pre-process and minimize outgoing data in a structured and enforceable manner before sending it to external cloud servers. Peekaboo’s key innovations are (1) abstracting common data preprocessing functionality into a small and fixed set of chainable operators, and (2) requiring that developers explicitly declare desired data collection behaviors (e.g., data granularity, destinations, conditions) in an application manifest, which also specifies how the operators are chained together. Given a manifest, Peekaboo assembles and executes a pre-processing pipeline using operators pre-loaded on the hub. In doing so, developers can collect smart home data on a need-to-know basis; third-party auditors can verify data collection behaviors; and the hub itself can offer a number of centralized privacy features to users across apps and devices, without additional effort from app developers. We present the design and implementation of Peekaboo, along with an evaluation of its coverage of smart home scenarios, system performance, data minimization, and example built-in privacy features. 
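     A hypothetical sketch in the spirit of this architecture (the manifest schema and operator names are illustrative assumptions, not the paper's exact format):

         # An app's manifest declares what it may collect and how a fixed set
         # of chainable operators pre-process the data on the in-home hub
         # before anything is sent out.
         manifest = {
             "app": "sleep-quality-monitor",
             "source": "bedroom_motion_sensor",
             "pipeline": [
                 {"op": "select",    "field": "motion_level"},  # keep one field
                 {"op": "aggregate", "window": 60},             # coarsen to means
                 {"op": "post",      "to": "https://cloud.example.com/sleep"},
             ],
         }

         def op_select(records, step):
             return [{step["field"]: r[step["field"]]} for r in records]

         def op_aggregate(records, step):
             values = [v for r in records for v in r.values()]
             return [{"window": step["window"], "mean": sum(values) / len(values)}]

         def op_post(records, step):
             print(f"hub -> {step['to']}: {records}")  # stand-in for a network call
             return records

         OPERATORS = {"select": op_select, "aggregate": op_aggregate, "post": op_post}

         def run_pipeline(manifest, records):
             """Assemble the declared chain from hub-resident operators and run it."""
             for step in manifest["pipeline"]:
                 records = OPERATORS[step["op"]](records, step)
             return records

         # Raw sensor records never leave the hub; only the declared summary does.
         run_pipeline(manifest, [{"motion_level": 3, "raw_frame": "..."},
                                 {"motion_level": 5, "raw_frame": "..."}])

     Because the pipeline is declared rather than buried in app code, the hub can enforce it and third-party auditors can check it without trusting the developer.
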
  5. Account sharing is a common, if officially unsanctioned, practice among workgroups, but so far understudied in higher education. We interview 23 workgroup members about their account sharing practices at a U.S. university. Our study is the first to explicitly compare IT and non-IT observations of account sharing as a "normal and easy" workgroup practice, as well as to compare student practices with those of full-time employees. We contrast our results with those in prior works and offer recommendations for security design and for IT messaging. Our findings that account sharing is perceived as low risk by our participants and that security is seen as secondary to other priorities offer insights into the gap between technical affordances and social needs in an academic workplace such as this.
  6. Since December 2020, the Apple App Store has required all developers to create a privacy label when submitting new apps or app updates. However, there has not been a comprehensive study on how developers responded to this requirement. We present the first measurement study of Apple privacy nutrition labels to understand how apps on the U.S. App Store create and update privacy labels. We collected weekly snapshots of the privacy label and other metadata for all 1.4 million apps on the U.S. App Store from April 2 to November 5, 2021. Our analysis showed that 51.6% of apps still did not have a privacy label as of November 5, 2021. Although 35.3% of old apps had created a privacy label, only 2.7% of old apps created a privacy label without app updates (i.e., voluntary adoption). Our findings suggest that inactive apps have little incentive to create privacy labels.
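     A minimal sketch of the kind of longitudinal comparison such a study involves, assuming the weekly snapshots are stored with app_id, has_label (0/1), and last_update columns (the file and column names are assumptions, not the paper's pipeline):

         import pandas as pd

         first = pd.read_csv("snapshot_2021-04-02.csv", parse_dates=["last_update"])
         last = pd.read_csv("snapshot_2021-11-05.csv", parse_dates=["last_update"])

         print(f"apps with no label: {(last['has_label'] == 0).mean():.1%}")

         # "Old" apps were already on the store at the first snapshot.
         old = last[last["app_id"].isin(first["app_id"])].set_index("app_id")
         had_label = first.set_index("app_id")["has_label"].reindex(old.index)
         gained = (old["has_label"] == 1) & (had_label == 0)

         # Voluntary adoption: a label appeared without any app update since
         # the first snapshot.
         no_update = old["last_update"] <= pd.Timestamp("2021-04-02")
         print(f"old apps that created a label: {gained.mean():.1%}")
         print(f"... without an app update:     {(gained & no_update).mean():.1%}")
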
  7. In this paper, we studied people’s smart home privacy-protective behaviors (SH-PPBs) to gain a better understanding of their privacy management do’s and don’ts in this context. We first surveyed 159 participants and elicited 33 unique SH-PPB practices, revealing that users rely heavily on ad hoc approaches at the physical layer (e.g., physical blocking, manual powering off). We also characterized the types of privacy concerns users wanted to address through SH-PPBs, the reasons preventing users from doing SH-PPBs, and the privacy features they wished they had to support SH-PPBs. We then storyboarded 11 privacy protection concepts to explore opportunities to better support users’ needs, and asked another 227 participants to critique and rank these design concepts. Among the 11 concepts, Privacy Diagnostics, which is similar to security diagnostics in anti-virus software, was preferred far above the rest. We also found rich evidence for four important factors in designing SH-PPB tools: users prefer solutions that are (1) simple, (2) proactive, and (3) preventative, and that (4) offer more control.